Equivalence of Additive-Combinatorial Linear Inequalities for Shannon Entropy and Differential Entropy
Authors
Abstract
Similar articles
Inequalities for Shannon Entropy and Kolmogorov Complexity
It was mentioned by Kolmogorov (1968, IEEE Trans. Inform. Theory 14, 662-664) that the properties of algorithmic complexity and Shannon entropy are similar. We investigate one aspect of this similarity. Namely, we are interested in linear inequalities that are valid for Shannon entropy and for Kolmogorov complexity. It turns out that (1) all linear inequalities that are valid for Kolmogorov com...
On the Estimation of Shannon Entropy
Shannon entropy is increasingly used in many applications. In this article, an estimator of the entropy of a continuous random variable is proposed. Consistency and scale invariance of the variance and mean squared error of the proposed estimator are proved, and comparisons are then made with the entropy estimators of Vasicek (1976), van Es (1992), Ebrahimi et al. (1994), and Correa (1995). A simulation st...
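As a small illustrative sketch of the kind of estimator this abstract compares against, the classic Vasicek (1976) spacing estimator can be written in a few lines. The sample size, window parameter `m`, and test distribution below are illustrative choices, not taken from the paper:

```python
import math
import random

def vasicek_entropy(sample, m):
    """Vasicek (1976) spacing estimator of differential entropy (in nats).

    sample: iid draws from a continuous distribution; m: window size, m < n/2.
    Averages log(n * (X_(i+m) - X_(i-m)) / (2m)) over the order statistics,
    clamping indices outside 1..n to the extremes.
    """
    x = sorted(sample)
    n = len(x)
    total = 0.0
    for i in range(n):
        lo = x[max(i - m, 0)]       # X_(i-m), clamped to X_(1)
        hi = x[min(i + m, n - 1)]   # X_(i+m), clamped to X_(n)
        total += math.log(n * (hi - lo) / (2 * m))
    return total / n

# Uniform(0,1) has differential entropy 0, so the estimate should be near 0.
random.seed(0)
est = vasicek_entropy([random.random() for _ in range(5000)], m=30)
print(est)
```

The estimator is known to be slightly biased downward for finite samples, which is one reason later estimators (van Es, Ebrahimi et al., Correa) propose corrections.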
Shannon Entropy, Renyi Entropy, and Information
This memo contains proofs that the Shannon entropy is the limiting case of both the Renyi entropy and the Tsallis entropy, or information. These results are also confirmed experimentally. We conclude with some general observations on the utility of entropy measures. A brief summary of the origins of the concept of physical entropy is provided in an appendix.
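The limiting relationship in this abstract is easy to check numerically: the Renyi entropy H_alpha = (1/(1 - alpha)) * log(sum p_i^alpha) approaches the Shannon entropy -sum p_i * log(p_i) as alpha tends to 1. A minimal sketch (the distribution below is an arbitrary example):

```python
import math

def shannon_entropy(p):
    """Shannon entropy in nats: -sum p_i * ln p_i."""
    return -sum(x * math.log(x) for x in p if x > 0)

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha != 1: (1/(1-alpha)) * ln(sum p_i^alpha)."""
    return math.log(sum(x ** alpha for x in p)) / (1.0 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
# As alpha -> 1, the Renyi entropy converges to the Shannon entropy.
for alpha in (0.9, 0.99, 0.999):
    print(alpha, renyi_entropy(p, alpha))
print("Shannon:", shannon_entropy(p))
```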
Shannon inequalities based on Tsallis relative operator entropy
Tsallis relative operator entropy is defined and its properties are given. The Shannon inequality and its reverse for Hilbert space operators, derived by T. Furuta [4], are extended in terms of the parameter of the Tsallis relative operator entropy. Moreover, the generalized Tsallis relative operator entropy is introduced and several operator inequalities are derived.
A Comprehensive Comparison of Shannon Entropy and Smooth Renyi Entropy
We provide a new result that links two crucial entropy notions: Shannon Entropy H1 and collision entropy H2. Our formula gives the worst possible amount of collision entropy in a probability distribution, when its Shannon Entropy is fixed. Our results and techniques used in the proof immediately imply many quantitatively tight separations between Shannon and smooth Renyi entropy, which were pre...
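The two quantities compared in this abstract are straightforward to compute for a discrete distribution: the Shannon entropy H1 and the collision (Renyi order-2) entropy H2 = -log(sum p_i^2), with H2 <= H1 always and equality for the uniform distribution. A small sketch (the example distribution is arbitrary):

```python
import math

def h1(p):
    """Shannon entropy in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def h2(p):
    """Collision (Renyi order-2) entropy in bits: -log2(sum p_i^2)."""
    return -math.log2(sum(x * x for x in p))

p = [0.7, 0.1, 0.1, 0.1]
print(h1(p), h2(p))  # H2 never exceeds H1
```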
Journal
Journal title: IEEE Transactions on Information Theory
Year: 2018
ISSN: 0018-9448, 1557-9654
DOI: 10.1109/tit.2018.2815687